Parallel Python

Part 2: Pool

One of the core multiprocessing features is multiprocessing.Pool. This provides a pool of workers that can be used to parallelise a map.

For example, create a new script called pool.py and type into it:

pool.py
from functools import reduce
from multiprocessing import Pool, cpu_count

def square(x):
    """Function to return the square of the argument"""
    return x * x

if __name__ == "__main__":
    # print the number of cores
    print(f"Number of cores available equals {cpu_count()}")

    # create a pool of workers
    with Pool() as pool:
        # create a range of 5000 integers, from 1 to 5000
        r = range(1, 5001)

        result = pool.map(square, r)

    total = reduce(lambda x, y: x + y, result)

    print(f"The sum of the square of the first 5000 integers is {total}")

Run the script using the command in the Terminal:

$
python pool.py
Number of cores available equals 4
The sum of the square of the first 5000 integers is 41679167500

(the number of cores will depend on the number available on your machine)

So how does this work? The line

with Pool() as pool:

has created a pool of worker copies of your script, with the number of workers equalling the number of cores reported by cpu_count(). You can control the number of copies by specifying the value of processes in the constructor for Pool, e.g.

with Pool(processes=5) as pool:

would create a pool of five workers.
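
For example, a common pattern, shown here only as a sketch rather than something this script needs, is to size the pool from cpu_count() while leaving one core free for the rest of the system:

from multiprocessing import Pool, cpu_count

def square(x):
    """Function to return the square of the argument"""
    return x * x

if __name__ == "__main__":
    # use all but one of the available cores (but always at least one worker)
    nworkers = max(1, cpu_count() - 1)

    with Pool(processes=nworkers) as pool:
        print(pool.map(square, range(1, 11)))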

The line

r = range(1, 5001)

is a quick way to create a sequence of the 5000 integers from 1 to 5000. The parallel work happens on the line

result = pool.map(square, r)

This performs a map of the function square over the list of items in r. The map is divided up over all of the workers in the pool. This means that, if you have 10 workers (e.g. if you have 10 cores), then each worker will perform only one tenth of the work (e.g. calculating the square of 500 numbers). If you have 2 workers, then each worker will perform only half of the work (e.g. calculating the square of 2500 numbers).
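
You can also control how the map is carved up: pool.map accepts an optional chunksize argument that sets how many items are handed to a worker at a time. The snippet below is just a sketch to illustrate the idea; the course script lets Pool pick the chunk size automatically:

from multiprocessing import Pool

def square(x):
    """Function to return the square of the argument"""
    return x * x

if __name__ == "__main__":
    with Pool(processes=2) as pool:
        # hand the work to the two workers in chunks of five numbers at a time
        result = pool.map(square, range(1, 21), chunksize=5)

    print(result)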

The next line

total = reduce(lambda x, y: x + y, result)

is just a standard reduce used to sum together all of the results.
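
As a quick check, which is not part of the script above, the built-in sum performs the same reduction, and the total agrees with the closed-form formula n(n+1)(2n+1)/6 for the sum of the first n squares:

total = sum(result)                        # the built-in sum does the same job as reduce here
assert total == 5000 * 5001 * 10001 // 6   # n(n+1)(2n+1)/6 with n = 5000, i.e. 41679167500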

You can verify that the calls to the square function are divided between your workers by using multiprocessing.current_process().pid, which returns the process ID (PID) of the worker process that is running. Edit your pool.py script and set its contents to:

pool.py
from functools import reduce
from multiprocessing import Pool, current_process

def square(x):
    """Function to return the square of the argument"""
    print(f"Worker {current_process().pid} calculating square of {x}")
    return x * x

if __name__ == "__main__":
    nprocs = 2

    # print the number of workers
    print(f"Number of workers equals {nprocs}")

    # create a pool of workers
    with Pool(processes=nprocs) as pool:
        # create a range of 20 integers, from 1 to 20
        r = range(1, 21)

        result = pool.map(square, r)

    total = reduce(lambda x, y: x + y, result)

    print(f"The sum of the square of the first 20 integers is {total}")

Run this script using

$
python pool.py
Number of workers equals 2
Worker 31089 calculating square of 1
Worker 31089 calculating square of 2
Worker 31089 calculating square of 3
Worker 31090 calculating square of 4
Worker 31090 calculating square of 5
Worker 31090 calculating square of 6
Worker 31089 calculating square of 7
Worker 31089 calculating square of 8
Worker 31089 calculating square of 9
Worker 31090 calculating square of 10
Worker 31090 calculating square of 11
Worker 31089 calculating square of 13
Worker 31090 calculating square of 12
Worker 31089 calculating square of 14
Worker 31089 calculating square of 15
Worker 31090 calculating square of 16
Worker 31090 calculating square of 17
Worker 31090 calculating square of 18
Worker 31089 calculating square of 19
Worker 31089 calculating square of 20
The sum of the square of the first 20 integers is 2870

(the exact PIDs of the workers, and the order in which they print, will be different on your machine)

You can see in the output that there are two workers, signified by the two different worker PIDs. The work has been divided evenly amongst them.
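
If you would rather tally the split than read through the printed lines, a small variation of the script (just a sketch, not part of the course scripts) can return each worker's PID alongside the square and count the items afterwards:

from collections import Counter
from multiprocessing import Pool, current_process

def square(x):
    """Return the worker's PID together with the square of the argument"""
    return current_process().pid, x * x

if __name__ == "__main__":
    with Pool(processes=2) as pool:
        results = pool.map(square, range(1, 21))

    # count how many items each worker processed
    counts = Counter(pid for pid, _ in results)
    print(counts)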

Exercise

Edit pool.py and change the value of nprocs. How is the work divided as you change the number of workers?

Using multiple pools in a single script

You can use more than one multiprocessing.Pool in your script, but you should use them one after another. When you create a Pool object, multiprocessing forks your script into the team of workers, and each worker contains a complete copy of all of the functions and variables that existed at the time of the fork. This means that anything defined or changed after the fork will not be visible to the workers.

If you made a Python script called broken_pool.py with the contents:

broken_pool.py
from multiprocessing import Pool

def square(x):
    """Return the square of the argument"""
    return x * x

if __name__ == "__main__":

    r = [1, 2, 3, 4, 5]

    with Pool() as pool:
        result = pool.map(square, r)

        print(f"Square result: {result}")

        def cube(x):
            """Return the cube of the argument"""
            return x * x * x

        result = pool.map(cube, r)

        print(f"Cube result: {result}")

and ran it you would see an error like:

AttributeError: Can't get attribute 'cube' on <module '__main__' from 'broken_pool.py'>

The problem is that pool was created before the cube function. The worker copies of the script were thus created before cube was defined, and so don’t contain a copy of this function. This is one of the reasons why you should always define your functions above the if __name__ == "__main__" block.
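
The cleanest fix is therefore to move cube up to module level, next to square, so that a single pool can map over either function. A minimal sketch of that layout:

from multiprocessing import Pool

def square(x):
    """Return the square of the argument"""
    return x * x

def cube(x):
    """Return the cube of the argument"""
    return x * x * x

if __name__ == "__main__":

    r = [1, 2, 3, 4, 5]

    with Pool() as pool:
        print(f"Square result: {pool.map(square, r)}")
        print(f"Cube result: {pool.map(cube, r)}")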

Alternatively, if you have to define the function inside the __main__ block, then make sure that you create the pool after the definition. For example, one fix for broken_pool.py is to define cube once the first pool has finished, and then create a second pool for the second map:

fixed_pool.py
from multiprocessing import Pool

def square(x):
    """Return the square of the argument"""
    return x * x

if __name__ == "__main__":

    r = [1, 2, 3, 4, 5]

    with Pool() as pool:
        result = pool.map(square, r)

        print(f"Square result: {result}")

    def cube(x):
        """Return the cube of the argument"""
        return x * x * x

    with Pool() as pool:
        result = pool.map(cube, r)

        print(f"Cube result: {result}")

Running this should print out

$
python fixed_pool.py
Square result: [1, 4, 9, 16, 25]
Cube result: [1, 8, 27, 64, 125]